Fast Solution of ℓ1-Norm Minimization Problems When the Solution May Be Sparse

Authors

  • David L. Donoho
  • Yaakov Tsaig
Abstract

The minimum ℓ1-norm solution to an underdetermined system of linear equations y = Ax is often, remarkably, also the sparsest solution to that system. This sparsity-seeking property is of interest in signal processing and information transmission. However, general-purpose optimizers are much too slow for ℓ1 minimization in many large-scale applications. The Homotopy method was originally proposed by Osborne et al. for solving noisy overdetermined ℓ1-penalized least squares problems. We here apply it to solve the noiseless underdetermined ℓ1-minimization problem min ‖x‖1 subject to y = Ax. We show that Homotopy runs much more rapidly than general-purpose LP solvers when sufficient sparsity is present. Indeed, the method often has the following k-step solution property: if the underlying solution has only k nonzeros, the Homotopy method reaches that solution in only k iterative steps. When this property holds and k is small compared to the problem size, ℓ1-minimization problems with k-sparse solutions can be solved in a fraction of the cost of solving one full-sized linear system. We demonstrate this k-step solution property for two kinds of problem suites. First, incoherent matrices A, where the off-diagonal entries of the Gram matrix AᵀA are all smaller than M. If y is a linear combination of at most k ≤ (M⁻¹ + 1)/2 columns of A, we show that Homotopy has the k-step solution property. Second, ensembles of d × n random matrices A. If A has iid Gaussian entries, then, when y is a linear combination of at most k < d/(2 log(n)) · (1 − εₙ) columns, with εₙ > 0 small, Homotopy again exhibits the k-step solution property with high probability. Further, we give evidence showing that for ensembles of d × n partial orthogonal matrices, including partial Fourier matrices and partial Hadamard matrices, with high probability, the k-step solution property holds up to a dramatically higher threshold k, satisfying k/d < ρ̂(d/n) for a certain empirically determined function ρ̂(δ). Our results imply that Homotopy can efficiently solve some very ambitious large-scale problems arising in stylized applications of error-correcting codes, magnetic resonance imaging, and NMR spectroscopy. Our approach also sheds light on the evident parallelism in results on ℓ1 minimization and Orthogonal Matching Pursuit (OMP), and aids in explaining the inherent relations between Homotopy, LARS, OMP, and Polytope Faces Pursuit.
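To make the problem and the k-step behavior concrete, here is a minimal sketch that builds a noiseless underdetermined system y = Ax0 with a k-sparse x0 and follows the lasso/homotopy path down to a zero penalty. It uses scikit-learn's lars_path as a stand-in for the Homotopy solver (the abstract itself notes the close relation between Homotopy and LARS); the dimensions d, n, k, the Gaussian matrix, and the choice of scikit-learn are illustrative assumptions, not values or tooling from the paper.

```python
# Sketch: recover a k-sparse x0 from noiseless measurements y = A x0 by
# tracing the lasso/homotopy path to a zero penalty.  scikit-learn's
# lars_path is used as a stand-in for the Homotopy method; all sizes
# below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
d, n, k = 200, 500, 10            # underdetermined: d < n, k nonzeros

A = rng.standard_normal((d, n)) / np.sqrt(d)   # iid Gaussian ensemble
x0 = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x0[support] = rng.standard_normal(k)
y = A @ x0                         # noiseless measurements

# Follow the path of  min 0.5*||y - A x||^2 + lam*||x||_1  as lam -> 0;
# its endpoint solves  min ||x||_1  subject to  y = A x  when recovery holds.
alphas, active, coefs = lars_path(A, y, method="lasso")
x_hat = coefs[:, -1]

print("path steps taken :", len(alphas) - 1)   # about k when the k-step property holds
print("support recovered:", set(np.flatnonzero(x_hat)) == set(support))
print("max abs error    :", float(np.max(np.abs(x_hat - x0))))
```

With k well below the d/(2 log n) regime described above, the path typically terminates in about k steps and returns the sparse generator exactly; raising k is a simple way to probe where the k-step property starts to fail.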


Similar resources

RSP-Based Analysis for Sparsest and Least ℓ1-Norm Solutions to Underdetermined Linear Systems

Recently, worst-case analysis, probabilistic analysis, and empirical justification have been employed to address the fundamental question: when does l1-minimization find the sparsest solution to an underdetermined linear system? In this paper, a deterministic analysis, rooted in classic linear programming theory, is carried out to further address this question. We first identify a necess...


Enhancing Sparsity by Reweighted l1 Minimization

It is now well understood that (1) it is possible to reconstruct sparse signals exactly from what appear to be highly incomplete sets of linear measurements and (2) this can be done by constrained l1 minimization. In this paper, we study a novel method for sparse signal recovery that in many situations outperforms l1 minimization in the sense that substantially fewer measurements are neede...

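As a concrete illustration of the reweighting idea summarized in this snippet, the sketch below alternates between a weighted l1 problem, cast as a linear program and solved with scipy.optimize.linprog, and the weight update w_i = 1/(|x_i| + ε) used in the full reweighted-l1 paper. The problem sizes, ε, and the number of outer iterations are arbitrary choices for illustration, and whether the error reaches zero depends on how k compares with the recovery threshold.

```python
# Sketch of iteratively reweighted l1 minimization for y = A x0, x0 sparse.
# Each weighted problem  min sum_i w_i |x_i|  s.t.  A x = y  is solved as an
# LP by splitting x = u - v with u, v >= 0.  Sizes are illustrative.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
d, n, k = 60, 128, 15
A = rng.standard_normal((d, n))
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x0

A_eq = np.hstack([A, -A])            # equality constraint A(u - v) = y
w = np.ones(n)                       # first pass = plain (unweighted) l1
eps = 1e-3
for it in range(4):
    c = np.concatenate([w, w])       # objective: sum_i w_i * (u_i + v_i)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    x = res.x[:n] - res.x[n:]
    print(f"iteration {it}: reconstruction error {np.linalg.norm(x - x0):.2e}")
    w = 1.0 / (np.abs(x) + eps)      # small entries get large weights next round
```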

Sparse and Robust Signal Reconstruction

Many problems in signal processing and statistical inference are based on finding a sparse solution to an underdetermined linear system. The reference approach to this problem of finding sparse signal representations on overcomplete dictionaries leads to convex unconstrained optimization problems, with a quadratic term (l2) for the adjustment to the observed signal, and a coefficient vector l1-no...

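The unconstrained quadratic-plus-l1 objective mentioned in this snippet is the standard lasso / basis-pursuit-denoising form min_x ½‖y − Ax‖₂² + λ‖x‖₁. A minimal sketch with scikit-learn's Lasso follows; the solver, the regularization level, the noise level, and the data sizes are assumptions for illustration, not choices from the cited paper.

```python
# Sketch: the unconstrained l2 + l1 objective (lasso / basis pursuit denoising)
#   min_x  0.5 * ||y - A x||_2^2 + lam * ||x||_1
# solved with scikit-learn's coordinate-descent Lasso.  Sizes, noise level, and
# regularization strength are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
d, n, k = 100, 256, 8
A = rng.standard_normal((d, n))
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x0 + 0.1 * rng.standard_normal(d)      # noisy observations

# Note: sklearn's Lasso minimizes (1 / (2*d)) * ||y - A x||^2 + alpha * ||x||_1,
# i.e. the quadratic term is averaged over the d samples.
model = Lasso(alpha=0.1, fit_intercept=False, max_iter=10000)
model.fit(A, y)
x_hat = model.coef_

print("nonzeros in estimate:", int(np.count_nonzero(x_hat)))
print("reconstruction error:", float(np.linalg.norm(x_hat - x0)))
```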


An L1-norm method for generating all of efficient solutions of multi-objective integer linear programming problem

This paper extends the method proposed by Jahanshahloo et al. (2004) (a method for generating all the efficient solutions of a 0–1 multi-objective linear programming problem, Asia-Pacific Journal of Operational Research). It considers the recession direction for a multi-objective integer linear programming (MOILP) problem and presents necessary and sufficient conditions to have unbounde...



Journal title:
  • IEEE Trans. Information Theory

Volume 54, Issue -

Pages -

Publication date: 2008